SpinalNet: Deep Neural Network With Gradual Input

Authors

Abstract

Deep neural networks (DNNs) have achieved state-of-the-art performance in numerous fields. However, DNNs require long computation times, and users always expect better performance at a lower computational cost. Therefore, we study the human somatosensory system and design a neural network (SpinalNet) to achieve higher accuracy with fewer computations. Hidden layers in traditional NNs receive inputs from the previous layer, apply an activation function, and then transfer the outcomes to the next layer. In the proposed SpinalNet, each layer is split into three parts: 1) the input split, 2) the intermediate split, and 3) the output split. The input split of each layer receives a part of the inputs. The intermediate split of each layer receives the outputs of the previous layer's intermediate split and the outputs of the current layer's input split. The number of incoming weights therefore becomes significantly lower than in traditional DNNs. SpinalNet can also be used as the fully connected or classification layer of a DNN and supports both traditional learning and transfer learning. We observe significant error reductions with lower computational costs in most DNNs. Traditional learning with the VGG-5 network and SpinalNet classification layers provided state-of-the-art (SOTA) performance on the QMNIST, Kuzushiji-MNIST, and EMNIST (Letters, Digits, and Balanced) datasets. Traditional learning with ImageNet pre-trained initial weights and SpinalNet classification layers provided SOTA performance on the STL-10, Fruits 360, Bird225, and Caltech-101 datasets. The scripts of the proposed SpinalNet are available at the following link: https://github.com/dipuk0506/SpinalNet

Impact Statement—Research on deep learning has gained attention from industry and academia due to its eye-catching performance. It has enabled machines to perform a myriad of tasks that once only humans could do. Several researchers have recently proposed different types of NNs to improve accuracy. The recent success of biologically inspired convolutional NNs and the architecture of the spinal cord motivated us to develop a NN with gradual inputs, which shows superior performance on several datasets. After the first online appearance of a preprint version of this paper, several researchers applied the proposed network to new datasets and reported promising results. SpinalNet may enable novel applications in the upcoming years.
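The gradual-input idea in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation (the official PyTorch code is in the linked repository); the sub-layer widths, the alternating-half input scheme, and the function names here are illustrative assumptions. Each sub-layer sees only part of the input plus the previous sub-layer's output, and all sub-layer outputs are concatenated for a classifier head, which is what reduces the incoming weight count relative to a dense layer:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def spinal_layer_params(rng, part_features, prev_width, width):
    # Weights for one sub-layer: it receives a slice of the raw input
    # (part_features) concatenated with the previous sub-layer's output.
    W = rng.standard_normal((part_features + prev_width, width)) * 0.1
    b = np.zeros(width)
    return W, b

def spinal_forward(x, layers, half):
    """x: (batch, features). Alternating halves of the input feed
    successive sub-layers; outputs of all sub-layers are concatenated."""
    outs = []
    prev = np.zeros((x.shape[0], 0))
    for i, (W, b) in enumerate(layers):
        part = x[:, :half] if i % 2 == 0 else x[:, half:]
        z = np.concatenate([part, prev], axis=1)
        prev = relu(z @ W + b)
        outs.append(prev)
    return np.concatenate(outs, axis=1)

# Toy example: 20 input features, 4 sub-layers of width 8.
rng = np.random.default_rng(0)
layers, prev_w = [], 0
for _ in range(4):
    layers.append(spinal_layer_params(rng, 10, prev_w, 8))
    prev_w = 8
features = spinal_forward(np.ones((2, 20)), layers, half=10)
# Weight count: (10+0)*8 + 3*(10+8)*8 = 512, versus 20*32 = 640
# for a dense layer mapping all 20 inputs to the same 32 outputs.
```

In this toy configuration the 4 sub-layers produce a concatenated feature vector of 32 values per sample, using fewer weights than the equivalent dense layer.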


Similar articles

Short term electric load prediction based on deep neural network and wavelet transform and input selection

Electricity demand forecasting is one of the most important factors in the planning, design, and operation of competitive electrical systems. However, most of the load forecasting methods are not accurate. Therefore, in order to increase the accuracy of the short-term electrical load forecast, this paper proposes a hybrid method for predicting electric load based on a deep neural network with a...

Full text

Gradual Learning of Deep Recurrent Neural Networks

Deep Recurrent Neural Networks (RNNs) achieve state-of-the-art results in many sequence-to-sequence tasks. However, deep RNNs are difficult to train and suffer from overfitting. We introduce a training method that trains the network gradually, and treats each layer individually, to achieve improved results in language modelling tasks. Training deep LSTM with Gradual Learning (GL) obtains perple...

Full text

Quantum-Inspired Neural Network with Sequence Input

To enhance the approximation and generalization ability of artificial neural network (ANN) by employing the principles of quantum rotation gate and controlled-not gate, a quantum-inspired neuron with sequence input is proposed. In the proposed model, the discrete sequence input is represented by the qubits, which, as the control qubits of the controlled-not gate after being rotated by the quant...

Full text

Neural Network Regression with Input Uncertainty

It is generally assumed when using Bayesian inference methods for neural networks that the input data contains no noise or corruption. For real-world (errors in variable) problems this is clearly an unsafe assumption. This paper presents a Bayesian neural network framework which allows for input noise given that some model of the noise process exists. In the limit where this noise process is sm...

Full text

Gradual Tuning: a better way of Fine Tuning the parameters of a Deep Neural Network

In this paper we present an alternative strategy for fine-tuning the parameters of a network. We named the technique Gradual Tuning. Once trained on a first task, the network is fine-tuned on a second task by modifying a progressively larger set of the network’s parameters. We test Gradual Tuning on different transfer learning tasks, using networks of different sizes trained with different regu...

Full text


Journal

Journal title: IEEE Transactions on Artificial Intelligence

Year: 2022

ISSN: 2691-4581

DOI: https://doi.org/10.1109/tai.2022.3185179